[SPARK-42809][BUILD] Upgrade scala-maven-plugin from 4.8.0 to 4.8.1 #40442
panbingkun wants to merge 1 commit into apache:master
Conversation
4.8.1 fixed a compilation issue with Scala 3.x that was introduced after 4.7.2.

Merged to master.

I do have some problems with this upgrade.

With `<scala-maven-plugin.version>4.8.0</scala-maven-plugin.version>`:

```
./build/mvn -DskipTests clean package
[INFO] Reactor Summary for Spark Project Parent POM 3.5.0-SNAPSHOT:
```

With `<scala-maven-plugin.version>4.8.1</scala-maven-plugin.version>`:

```
./build/mvn -DskipTests clean package
[INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ spark-core_2.12 ---
```
Hmm... I ran it and it passed. So, in what environment did you fail to compile? @bjornjorgensen

Hmm.. OK. The other environment that has the same problem is Ubuntu with OpenJDK 17. This one is running on GitHub: https://github.com/bjornjorgensen/jupyter-spark-master-docker/actions/runs/4448461787/jobs/7811305871#step:7:42144

When using Java 17.0.6 with 4.8.1, I can reproduce this issue too. cc @HyukjinKwon @panbingkun
I think GA can pass because it builds with `-Djava.version=17` set explicitly.

```
./build/mvn -DskipTests -Djava.version=17 clean package
[INFO] --- scala-maven-plugin:4.8.1:compile (scala-compile-first) @ spark-tags_2.12 ---
```

@bjornjorgensen You need to make sure that the Java you're using is really 17 (the compiler bridge should be `2.12.17__61.0-1.8.0`, not `55.0`; class-file version 61 is Java 17, while 55 is Java 11).
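As an aside, here is a minimal sketch (not from this thread) for checking which JVM a build actually runs on. The `java.class.version` system property reports the class-file format that shows up in the zinc compiler-bridge name:

```scala
// Minimal sanity check (illustrative, not part of the Spark build): print the
// properties that reveal which JVM is in use. The "55.0" vs "61.0" suffix in
// the zinc compiler-bridge file name corresponds to java.class.version.
object JavaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(s"java.version       = ${System.getProperty("java.version")}")       // e.g. 17.0.6
    println(s"java.home          = ${System.getProperty("java.home")}")          // which JDK install is active
    println(s"java.class.version = ${System.getProperty("java.class.version")}") // 61.0 on Java 17, 55.0 on Java 11
  }
}
```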
`archlinux-java status` shows which Java environments are installed and which one is active. This problem is why I wrote the email "Failed to build master google protobuf protoc-3.22.0-linux-x86_64.exe" to dev@spark.org.

I opened a ticket for this at davidB/scala-maven-plugin#684. Next week, I will try to find a minimal case to reproduce the problem.
@HyukjinKwon @bjornjorgensen @panbingkun In addition to the issues @bjornjorgensen mentioned, when compiling with Java 8 + Scala 2.13 I saw compilation errors as shown in the figure. Moreover, given davidB/scala-maven-plugin#686, I think we should revert this pr @HyukjinKwon

I opened #40482, a pr to revert this one.
Revert "[SPARK-42809][BUILD] Upgrade scala-maven-plugin from 4.8.0 to 4.8.1"

### What changes were proposed in this pull request?
This pr aims to revert "[SPARK-42809][BUILD] Upgrade scala-maven-plugin from 4.8.0 to 4.8.1".

### Why are the changes needed?
As mentioned in #40442, there are some regressions with 4.8.1:

- Running `./build/mvn -DskipTests clean package` with Java 17 fails:

```
[INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ spark-core_2.12 ---
[INFO] Not compiling main sources
[INFO]
[INFO] --- scala-maven-plugin:4.8.1:compile (scala-compile-first) @ spark-core_2.12 ---
[INFO] Compiler bridge file: /home/bjorn/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.8.0-bin_2.12.17__55.0-1.8.0_20221110T195421.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.17,1.7.10,null)
[INFO] compiling 597 Scala sources and 103 Java sources to /home/bjorn/github/spark/core/target/scala-2.12/classes ...
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: not found: value sun
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: not found: object sun
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: not found: object sun
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:210: not found: type Unsafe
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:212: not found: type Unsafe
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala:452: not found: value sun
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: not found: object sun
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type SignalHandler
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type Signal
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:83: not found: type Signal
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: type SignalHandler
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: value Signal
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:114: not found: type Signal
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:116: not found: value Signal
[ERROR] [Error] /home/bjorn/github/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:128: not found: value Signal
[ERROR] 19 errors found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Spark Project Parent POM 3.5.0-SNAPSHOT:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 3.848 s]
[INFO] Spark Project Tags ................................. SUCCESS [ 12.106 s]
[INFO] Spark Project Sketch ............................... SUCCESS [ 10.685 s]
[INFO] Spark Project Local DB ............................. SUCCESS [ 8.743 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 9.362 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 7.828 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 9.071 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 4.776 s]
[INFO] Spark Project Core ................................. FAILURE [ 17.228 s]
```

- Running `build/mvn clean install -DskipTests -Pscala-2.13` with Java 8 + Scala 2.13 produces compilation errors like `[ERROR] -release is only supported on Java 9 and higher`, although they do not cause the build to fail.

Also, given davidB/scala-maven-plugin#686, it seems that 4.8.1 does not work well with Java 8.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Pass GA

Closes #40482 from LuciferYang/REVERT-SPARK-42809.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Kent Yao <yao@apache.org>
### What changes were proposed in this pull request?
Downgrade `scala-maven-plugin.version` from 4.8.1 to 4.8.0 and add a note.

### Why are the changes needed?
See #41228, #40442, and 1530e8d. As mentioned in #40442, there are some regressions with 4.8.1:

- Running `./build/mvn -DskipTests clean package` with Java 17 fails with the same `not found: value sun` / `not found: object sun` errors in `StorageUtils.scala`, `SignalUtils.scala`, and related files, quoted in full in the revert above.
- Running `build/mvn clean install -DskipTests -Pscala-2.13` with Java 8 + Scala 2.13 produces compilation errors like `[ERROR] -release is only supported on Java 9 and higher`, although they do not cause the build to fail.

Also, given davidB/scala-maven-plugin#686, it seems that 4.8.1 does not work well with Java 8.

### Does this PR introduce _any_ user-facing change?
No.

### How was this patch tested?
Pass GA

Closes #41261 from bjornjorgensen/revers-scala-maven-plugin.

Authored-by: bjornjorgensen <bjornjorgensen@gmail.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
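For context on the `-release is only supported on Java 9 and higher` error: the option simply does not exist on JDK 8, so a plugin has to gate it on the running JDK. Below is a hedged sketch of such a guard; this is hypothetical code for illustration, not scala-maven-plugin's actual implementation:

```scala
// Hypothetical version guard (illustrative only, not the plugin's real code):
// only pass -release to the compiler when the running JDK is 9 or newer,
// since compilers running on JDK 8 reject the option outright.
object ReleaseFlag {
  def javaMajorVersion: Int = {
    val spec = System.getProperty("java.specification.version") // "1.8" on Java 8, "17" on Java 17
    if (spec.startsWith("1.")) spec.substring(2).toInt else spec.toInt
  }

  def releaseArgs(targetJava: Int): Seq[String] =
    if (javaMajorVersion >= 9) Seq("-release", targetJava.toString)
    else Seq.empty // on JDK 8 there is no -release; fall back to the defaults
}
```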
### What changes were proposed in this pull request?
This pr downgrades `scala-maven-plugin` to version 4.7.1 to avoid it automatically adding the `-release` option as a Scala compilation argument.

### Why are the changes needed?
`scala-maven-plugin` versions 4.7.2 and later try to automatically append the `-release` option as a Scala compilation argument when it is not specified by the user:

1. 4.7.2 and 4.8.0: add the `-release` option for Scala versions 2.13.9 and higher.
2. 4.8.1: append the `-release` option for Scala versions 2.12.x/2.13.x/3.1.1, and append `-java-output-version` for Scala 3.1.2.

The automatically added `-release` option has caused the issues mentioned in SPARK-44376 | #41943 and #40442 (comment). This is because `-release` imposes stronger compilation restrictions than `-target`: it guarantees not only the bytecode format, but also that the APIs used in the code are compatible with the specified version of Java. However, many APIs in the `sun.*` packages are not `exports` in Java 11, 17, and 21, such as `sun.nio.ch.DirectBuffer`, `sun.util.calendar.ZoneInfo`, and `sun.nio.cs.StreamDecoder`, making them invisible when compiling across different versions.

For discussions within the Scala community, see scala/bug#12643, scala/bug#12824, and scala/bug#12866, but this is not a bug. I have also submitted an issue to the `scala-maven-plugin` community to discuss the possibility of adding a setting to control the automatic addition of the `-release` option: davidB/scala-maven-plugin#722.

For Apache Spark 4.0, in the short term, I suggest downgrading `scala-maven-plugin` to version 4.7.1 to avoid it automatically adding the `-release` option as a Scala compilation argument. In the long term, we should reduce the use of APIs that are not `exports` so that we stay compatible with the `-release` compilation option, since `-target` is already deprecated as of Scala 2.13.9.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Pass GitHub Actions
- Manual check:

Run `git revert 656bf36` to revert to using Scala 2.13.11 and run `dev/change-scala-version.sh 2.13` to change Scala to 2.13.

1. Run `build/mvn clean install -DskipTests -Pscala-2.13 -X` to check the Scala compilation arguments.

Before:

```
[DEBUG] [zinc] Running cached compiler 1992eaf4 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with: -unchecked -deprecation -feature -explaintypes -target:jvm-1.8 -Wconf:cat=deprecation:wv,any:e -Wunused:imports -Wconf:cat=scaladoc:wv -Wconf:cat=lint-multiarg-infix:wv -Wconf:cat=other-nullary-override:wv -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s -Wconf:msg=method without a parameter list overrides a method with a single empty one:s -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e -Wconf:cat=unchecked&msg=outer reference:s -Wconf:cat=unchecked&msg=eliminated by erasure:s -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s -Wconf:msg=Implicit definition should have explicit type:s -release 8 -bootclasspath ...
```

After:

```
[DEBUG] [zinc] Running cached compiler 72dd4888 for Scala compiler version 2.13.11
[DEBUG] [zinc] The Scala compiler is invoked with: -unchecked -deprecation -feature -explaintypes -target:jvm-1.8 -Wconf:cat=deprecation:wv,any:e -Wunused:imports -Wconf:cat=scaladoc:wv -Wconf:cat=lint-multiarg-infix:wv -Wconf:cat=other-nullary-override:wv -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s -Wconf:msg=method without a parameter list overrides a method with a single empty one:s -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e -Wconf:cat=unchecked&msg=outer reference:s -Wconf:cat=unchecked&msg=eliminated by erasure:s -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s -Wconf:msg=Implicit definition should have explicit type:s -target:8 -bootclasspath ...
```

After downgrading the version, the `-release` option no longer appears in the compilation arguments.

2. Maven can build the project with Java 17 without the issue described in #41943.

After this pr, we can re-upgrade Scala 2.13 to Scala 2.13.11.

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42899 from LuciferYang/SPARK-45144.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
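To make the `-release` vs `-target` difference described in the commit message above concrete, here is a minimal sketch; it is assumed code modeled loosely on the `StorageUtils` pattern, not copied from Spark, and it typically compiles under `-target:jvm-1.8` but fails under `-release 8` on a modern JDK:

```scala
// Illustrative only: references a JDK-internal API, assuming the build allows
// access to it (Spark's build passes --add-exports for java.base/sun.nio.ch).
// With -target, scalac only constrains the emitted bytecode format and resolves
// sun.nio.ch.DirectBuffer against the full JDK on the classpath. With -release,
// symbols are resolved against the exported API of the chosen Java version, and
// since sun.nio.ch is internal the import fails with "not found: object sun",
// exactly the errors quoted in the revert pr above.
import java.nio.ByteBuffer
import sun.nio.ch.DirectBuffer

object BufferKind {
  // True when the buffer is an off-heap (direct) buffer backed by native memory.
  def isOffHeap(buffer: ByteBuffer): Boolean = buffer.isInstanceOf[DirectBuffer]
}
```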

### What changes were proposed in this pull request?
The pr aims to upgrade scala-maven-plugin from 4.8.0 to 4.8.1.

### Why are the changes needed?
Routine upgrade.

davidB/scala-maven-plugin@4.8.0...4.8.1

### Does this PR introduce any user-facing change?
No.

### How was this patch tested?
Pass GA.